MustGather for IBM InfoSphere Information Server

Versions:   8.5.x, 8.7.x, 9.1.x, 11.3.x, 11.5.x, 11.7

on Windows, IBM AIX, Sun Solaris, HP-UX, Linux RedHat, Linux SUSE, and zLinux platforms
where supported

Updated: 20 December 2017

This document describes how to collect troubleshooting data for problems with IBM InfoSphere Information Server and its components. Collecting data before you open a PMR helps IBM Support determine the cause of the problem more quickly.

Collecting Information Server information

To provide IBM Support with sufficient information to debug a problem, collect and send the items described in the sections below.

Depending on the product and the Information Server component that you are using, there may be product-specific or component-specific information that you will also need to collect.

For the data and files collection tasks, assume the following default folder locations. Actual folder locations may differ:

Each entry lists the path name, a description, the Linux or UNIX default, and the Windows default.

IS_HOME
  Folder where Information Server is installed.
  Linux or UNIX: /opt/IBM/InformationServer
  Windows: C:\IBM\InformationServer

WAS_HOME
  Folder where IBM WebSphere Application Server is installed.
  Linux or UNIX: /opt/IBM/WebSphere/AppServer (for WAS) or /opt/IBM/InformationServer/wlp (for WAS Liberty)
  Windows: C:\IBM\WebSphere\AppServer (for WAS) or C:\IBM\InformationServer\wlp (for WAS Liberty)

WAS_IS_PROFILE
  Information Server profile folder under WebSphere Application Server.
  Linux or UNIX: WAS_HOME/profiles/default or WAS_HOME/profiles/InfoSphere (for WAS); WAS_HOME/usr/server (for WAS Liberty)
  Windows: WAS_HOME\profiles\default or WAS_HOME\profiles\InfoSphere (for WAS); WAS_HOME\usr\server (for WAS Liberty)

WAS_DMGR_PROFILE (not meaningful in WAS Liberty)
  Deployment Manager profile folder under a clustered WebSphere Application Server.
  Linux or UNIX: /opt/IBM/WebSphere/AppServer/profiles/Dmgr01
  Windows: C:\IBM\WebSphere\AppServer\profiles\Dmgr01

WAS_HTTP_SERVER
  HTTP Server and dispatcher in a clustered WebSphere Application Server.
  Linux or UNIX: /opt/IBM/HTTPServer
  Windows: C:\IBM\HTTPServer

WAS_HTTP_PLUGIN
  HTTP Server plugin in a clustered WebSphere Application Server.
  Linux or UNIX: /opt/IBM/WebSphere/Plugins
  Windows: C:\IBM\WebSphere\Plugins

DB2_HOME
  Folder where DB2 is installed.
  Linux or UNIX: /opt/IBM/db2/V9 (for DB2 version 9) or /opt/IBM/db2/V10 (for DB2 version 10)
  Windows: C:\IBM\SQLLIB

USER_HOME
  User home folder.
  Linux or UNIX: /home/<username>
  Windows: C:\Documents and Settings\<username>
  Windows 7 or higher, and Windows 64 bits: C:\Users\<username>, or use the environment variable %USERPROFILE%
  Note: Some folders under the users' folders may be hidden.

CRED_USER_HOME
  Credential-mapped user home folder.
  Linux or UNIX: /home/<credential_mapped_user>
  Windows: C:\Documents and Settings\<credential_mapped_user>

JREPORT_HOME
  Folder where JReport artifacts are located.
  Information Server 8.5/8.7/9.1:
    Linux or UNIX: <TEMP>/InformationServer/Reporting<Host><Node><Server>/engine/JReport
    Windows: <TEMP>\InformationServer\Reporting<Host><Node><Server>\engine\JReport
  Information Server 11.3 and above:
    Linux or UNIX: <TEMP>/informationServer/Reporting*/engine/JReport
    Windows: <TEMP>\informationServer\Reporting*\engine\JReport

ORACLE_HOME
  Oracle client folder.
  Linux or UNIX: /opt/oracle/product/10.2.0 (for Oracle version 10.2)
  Windows: C:\oracle\product\10.2.0\client_1

TEMP
  System temporary folder.
  Linux or UNIX: /tmp or /var/tmp
  Windows: C:\Documents and Settings\<user>\Local Settings\Temp

 

Index of Topics
Information Server general information
1. General information about the problem encountered
2. General Hardware, Software, and environment information to collect
3. General Information Server information and common files to collect

 
Database information
4. DB2 and XMeta database information
 
IBM WebSphere Application Server information
5. WebSphere Application Server information and files to collect
Information Server Components
6. Information Server Installation files to collect
7. DataStage (DS) information and files to collect
8. QualityStage (QS) information and files to collect
9. Information Analyzer (IA) information and files to collect
10. Business Glossary (BG) information and files to collect
11. Information Services Director (ISD) information and files to collect
12. FastTrack (FT) information and files to collect
13. Reporting Framework information and files to collect

14. PX Engine information and files to collect
15. XMeta
16. XMeta Metabrokers and Bridges
17. Information Server auditing files to collect
18. SAP packs
19. Connectivity
20. Balanced Optimization
21. Migration files to collect
22. XML Transformation information and files to collect
23. OPS Console Workload Manager

24. Data Flow Designer (DFD)
25. Unified Governance (UG)
 
FTP Collected data to IBM
What to do next: FTP Collected data to IBM

 

1. General information about the problem encountered

Return to Index

Provide IBM Support with enough information to identify the scenario and help troubleshoot the problem. The following list contains examples of helpful information:

 

2. General hardware, software, and environment information to collect

Return to Index

Gather the following information for each host or client machine:

Use the following commands to gather the information. Read the operating system manual pages and product documentation for usage and other information, if necessary.
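For example, on Linux you can run the commands from a small shell script that writes each output to its own file in a single directory for easy collection (a minimal sketch only; substitute the commands from the table for your platform):

    #!/bin/sh
    # Collect general OS information into ./osinfo (Linux example)
    mkdir -p osinfo
    uname -a        > osinfo/uname.txt
    /bin/rpm -qa    > osinfo/rpm-qa.txt
    env             > osinfo/env.txt
    /bin/dmesg      > osinfo/dmesg.txt 2>&1
    /sbin/sysctl -a > osinfo/sysctl.txt 2>&1
    ulimit -a       > osinfo/ulimit.txt
    ps -ef          > osinfo/ps.log
    mount           > osinfo/mount.txt
    df -a -k        > osinfo/df.txt
    ifconfig -a     > osinfo/ifconfig.txt 2>&1
    hostname -f     > osinfo/hostname.txt
    # Copy the system configuration files listed in the table
    cp /etc/services /etc/hosts /etc/profile /etc/security/limits.conf osinfo/ 2>/dev/null
    tar -cf osinfo.tar osinfo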

AIX

Command

Collects

prtconf Machine hardware information, machine model, CPUs, network
uname -a or oslevel -q operating system version
/usr/sbin/instfix -ivq operating system patches or maintenance level
env operating system environment variable
errpt operating system configuration and error reporting
/usr/sbin/lsattr -E -l sys0 operating system kernel parameters
ulimit -a operating system current ulimit (system resources) settings
ps -ef > ps.log operating system processes, saved in ps.log file
mount List of the mounted file systems with location of scratch and permanent disk resources
df -g Description of the amount of space available on each mounted file system
ifconfig -a Network interfaces on the system, including the associated IP address
hostname Fully-qualified DNS name for this system
Collect:  /etc/services
          /etc/hosts
          /etc/profile
          /etc/security/limits.conf
System configuration files

Linux

Command

Collects

uname -a operating system version
/bin/rpm -qa operating system patches or maintenance level
env operating system environment variable
/bin/dmesg operating system configuration and error reporting
/sbin/sysctl -a operating system kernel parameters
ulimit -a operating system current ulimit (system resources) settings
ps -ef > ps.log operating system processes, saved in ps.log file
mount List of the mounted file systems with location of scratch and permanent disk resources
df -a -k Description of the amount of space available on each mounted file system
ifconfig -a Network interfaces on the system, including the associated IP address
hostname -f Fully-qualified DNS name for this system
Collect:  /etc/services
          /etc/hosts
          /etc/profile
          /etc/security/limits.conf
System configuration files

Solaris

Command

Collects

prtdiag Machine hardware information, CPUs
uname -a operating system version
showrev operating system level summary
patchadd -p operating system System patches
env operating system environment variable
sysdef operating system System Definition
ulimit -a operating system current ulimit (system resources) settings
ps -ef > ps.log operating system processes, saved in ps.log file
mount List of the mounted file systems with location of scratch and permanent disk resources
df -k Description of the amount of space available on each mounted file system
ifconfig -a Network interfaces on the system, including the associated IP address
hostname DNS name for this system
/usr/bin/dmesg operating system configuration and error reporting
Collect:  /etc/services
          /etc/hosts
          /etc/profile
          /etc/security/limits.conf
System configuration files

HP-UX

Command

Collects

model Machine hardware model
uname -a operating system version
swlist operating system Software installed
env operating system environment variable
sysdef operating system System Definition
ulimit -f operating system current ulimit (system resources) settings
ps -ef > ps.log operating system processes, saved in ps.log file
showmount -e List of the mounted file systems with location of scratch and permanent disk resources
bdf Description of the amount of space available
ifconfig lo0 Network interfaces on the system, including the associated IP address
hostname DNS name for this system
Collect:  /etc/services
          /etc/hosts
          /etc/profile
          /etc/security/limits.conf
System configuration files

Windows

Command

Collects

systeminfo Hardware information summary
winver or ver operating system version
winmsd   or
start /wait msinfo32.exe /category SystemSummary+SWEnv /report summary.txt
operating system patches or maintenance level.
Creates summary.txt file with system information
set operating system environment variable
  • cscript C:\windows\system32\eventquery.vbs /L application /R 1000 /V
  • cscript C:\windows\system32\eventquery.vbs /L system /R 1000 /V
  • cscript C:\windows\system32\eventquery.vbs /L security /R 1000 /V
operating system configuration and error reporting
REG QUERY HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall /S Software installed in the system
netstat -an > netstat.log Network connections, saved in netstat.log file
Collect:  <drive>:\WINDOWS\system32\drivers\etc\hosts
          <drive>:\WINDOWS\system32\drivers\etc\services
System configuration files
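Note: eventquery.vbs is not shipped with newer Windows versions. If it is missing, the same event logs can be exported with the wevtutil command instead (an assumed equivalent, saving the most recent 1000 records of each log as text):

    wevtutil qe Application /c:1000 /rd:true /f:text > application_events.txt
    wevtutil qe System /c:1000 /rd:true /f:text > system_events.txt
    wevtutil qe Security /c:1000 /rd:true /f:text > security_events.txt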

 

3. General Information Server information and common files to collect

Return to Index

Most of the Information Server server components run under WebSphere Application Server and might use the DB2 database to store data and information. Also, many components share the same log files to log events. Therefore, there are several files and artifacts that contain information provided by multiple components. Some of these common files are described here, for easy collection. Others are described under the particular Information Server component area. Files that are only provided by an Information Server component are described under that component's area.

You must log in as an Administrator (in Windows) or root (in Linux, Solaris or AIX) before running the commands described in this document.

Directory listing
To diagnose permission issues with files and directories, collect the following directory listings, with owner and group permission settings (see the example commands after the table):

Directory listing only

Description

<IS_HOME>\ASBServer\*      [only directory listing] Listing of all files under folder and subfolders, with owner and group permission settings
<IS_HOME>\ASBNode\*        [only directory listing]
<TEMP>\*                   [only directory listing]
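For example, the listings can be produced with the following commands (a sketch; any recursive listing that shows owner, group, and permission settings is acceptable):

    Linux or UNIX:
      ls -lR /opt/IBM/InformationServer/ASBServer > asbserver_listing.txt
      ls -lR /opt/IBM/InformationServer/ASBNode   > asbnode_listing.txt
      ls -lR /tmp                                 > temp_listing.txt

    Windows (the /q switch includes the file owner):
      dir /s /q C:\IBM\InformationServer\ASBServer > asbserver_listing.txt
      dir /s /q C:\IBM\InformationServer\ASBNode   > asbnode_listing.txt
      dir /s /q %TEMP%                             > temp_listing.txt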

Files and Data to collect

Description

<IS_HOME>\Version.xml

Provides information about Information Server components installed on machine, version, DB2 and WebSphere Application Server installation details

<IS_HOME>\buildinfo.txt Provides build version information
<IS_HOME>\ASBServer\etc\jacc.config JACC provider configuration file
<IS_HOME>\ASBServer\conf\* ASBServer installation configuration files, including SSL keys
<IS_HOME>\ASBServer\apps\lib\*.properties Property files that are in the Application Server classpath
<IS_HOME>\ASBServer\apps\lib\ojb-conf.jar A central jar file for XMeta containing various details about deployed models as well as XMeta POJO connection parameters.
<IS_HOME>\ASBServer\bin\startMetadataServer.log Contains trace entries from starting WAS during a system startup.
<IS_HOME>\ASBNode\*.log
<IS_HOME>\ASBNode\*.err
<IS_HOME>\ASBNode\*.out
<IS_HOME>\ASBNode\orbtrc*
<IS_HOME>\ASBNode\orbmsg*
<IS_HOME>\ASBNode\core*
<IS_HOME>\ASBNode\bin\*.log
<IS_HOME>\ASBNode\bin\*.err
<IS_HOME>\ASBNode\bin\*.out
<IS_HOME>\ASBNode\bin\orbtrc*
<IS_HOME>\ASBNode\bin\orbmsg*

All logs, error logs, output files, ORB trace/message files, and core files under ASBNode and ASBNode\bin

<IS_HOME>\ASBNode\logging-agent-buffer.ser
<IS_HOME>\ASBNode\bin\logging-agent-buffer.ser

Serialized log events

<IS_HOME>\ASBNode\javacore*
<IS_HOME>\ASBNode\heapdump*
<IS_HOME>\ASBNode\bin\javacore*
<IS_HOME>\ASBNode\bin\heapdump*

Java core and heap dump files, created on OutOfMemoryErrors; there may be several, collect the most recent ones
Information Server 8.5 only
<IS_HOME>\ASBNode\eclipse\plugins\com.ibm.isf.client_8.5.0.0\registered-servers.xml registered-servers.xml file
Information Server 8.7 and above
<IS_HOME>\ASBNode\eclipse\plugins\com.ibm.isf.client\registered-servers.xml registered-servers.xml file
Agent's logs  
<IS_HOME>\ASBNode\logs\asb-agent*.out
<IS_HOME>\ASBNode\logs\*.pid
<IS_HOME>\ASBNode\logs\*.properties
<IS_HOME>\ASBNode\logs\*\*.log
<IS_HOME>\ASBNode\logs\*\*.log.pos
<IS_HOME>\ASBNode\logs\*\*.log.err
Log files from Logging Agent:
*.log* files only from subfolders under \logs
Migration logs  
<IS_HOME>\Migration\logs\* Migration log files

Windows registry

The following Windows registry keys are used by Information Server and its components:

HKEY_LOCAL_MACHINE\SOFTWARE\IBM\InformationServer
HKEY_LOCAL_MACHINE\SOFTWARE\Ascential Software
HKEY_LOCAL_MACHINE\SOFTWARE\Mortice Kern Systems
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\SubSystems
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters

Use the following commands to collect Windows registry entries specific to Information Server. Some of the keys might not exist, depending on the Information Server components that were installed.

Command

REG QUERY HKLM\SOFTWARE\IBM\InformationServer /S
REG QUERY "HKLM\SOFTWARE\IBM\Ascential Software" /S
REG QUERY "HKLM\SOFTWARE\Mortice Kern Systems" /S
REG QUERY "HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\SubSystems" /S
REG QUERY "HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management" /S
REG QUERY HKLM\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters /S
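To save the output for IBM Support, redirect each query to a text file, or export the key with REG EXPORT; for example, for the first key (repeat for the others):

    REG QUERY HKLM\SOFTWARE\IBM\InformationServer /S > is_registry.txt
    REG EXPORT HKLM\SOFTWARE\IBM\InformationServer is_registry.reg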

 

4. DB2 and XMeta database information

Return to Index

Use the following commands to invoke the standard DB2 collector tool (db2support tool):

Command

Collects

Windows:
  
cd <DB2_HOME>\bin
   db2cmd  db2support . -f -d XMeta
where XMeta is the name of the Information Server XMeta database

AIX, Solaris, Linux, or HP-UX:
   cd <DB2_HOME>/bin
   ./db2support . -f -d XMeta
where XMeta is the name of the Information Server XMeta database

Creates a zip file with collected data in <DB2_HOME>/bin/db2support.zip. Contains additional statistics of the XMeta database

The db2support tool creates the zip file <DB2_HOME>/bin/db2support.zip, which contains DB2 configuration details and additional statistics for the XMeta database.

5. WebSphere Application Server information and files to collect

Return to Index

The commands presented in this section invoke the standard IBM WebSphere Application Server collector tool, which creates a zip file containing WebSphere Application Server data, profiles, and logs.

Also make a copy of the profile data from each server in the node, using the profile where Information Server was installed. The following artifacts contain Information Server data and logs.

Files and Data to collect Description

<WAS_IS_PROFILE>/logs/*

All files contained in the /logs folder and its subfolders

<WAS_IS_PROFILE>/properties/portdef.props
<WAS_IS_PROFILE>/logs/portdef.props
 

<WAS_IS_PROFILE>/classes/*.properties
<WAS_IS_PROFILE>/classes/ojb-conf.jar

 

<WAS_IS_PROFILE>/javacore*
<WAS_IS_PROFILE>/heapdump*

Java Core and Heap dump files created by OutOfMemoryErrors. There may be several but collect the most recent ones

Non Clustered WAS environment Description
<WAS_IS_PROFILE>/config/cells/<hostname>Cell01/security.xml
<WAS_IS_PROFILE>/config/cells/<hostname>Cell01/
     nodes/<hostname>Node01/serverindex.xml
<WAS_IS_PROFILE>/config/cells/<hostname>Cell01/
     nodes/<hostname>Node01/servers/
     <servername>/server.xml
 
Clustered WAS environment (IS 8.5 and above) Description
Gather from every <PROFILE> folder: both the <WAS_DMGR_PROFILE> and every <WAS_NODE_PROFILE> in the cluster:

<PROFILE>/config/cells/<hostname>Cell01/security.xml
<PROFILE>/config/cells/<anyFolder>/*.xml
<PROFILE>/config/cells/<anyFolder>/jacc.config
<PROFILE>/config/cells/<anyFolder>/clusters/<anyFolder>/*.xml
<PROFILE>/config/cells/<anyFolder>/nodes/<anyFolder>*/*.xml
<PROFILE>/config/cells/<anyFolder>/nodes/<anyFolder>/servers/<anyFolder>/*.xml
<PROFILE>/config/cells/<anyFolder>/wim/<anyFolderAndSubFolders>/*.xml
Collect these from the DMgr node and from every node in the cluster, as well as for each server in the node
IBM WebSphere Application Server version 8 and above Description
<WAS_IS_PROFILE>/logs/server1/TextLog_xxxxxx.log
<WAS_IS_PROFILE>/logs/server1/logdata/*.*
<WAS_IS_PROFILE>/logs/server1/tracedata/*.*
WAS version 8 High Performance Extensible Logging (HPEL) logs
IBM WebSphere Application Server Liberty Description
<WAS_IS_PROFILE>/mis/logs/*.log
<WAS_IS_PROFILE>/mis/server.xml
<WAS_IS_PROFILE>/mis/bootstrap.properties
WAS Liberty log and configuration files

Collect all WebSphere Application Server artifacts as shown in the following tables:

AIX, Solaris, Linux, or HP-UX

Command

Collects

  • Verify that the path contains the following system directories:
    /bin
    /sbin
    /usr/bin
    /usr/sbin
  • Run the following command from inside a temporary directory (outside of the WAS_HOME folder)

<WAS_HOME>/bin/collector.sh -verbosity 5 -JarOut CRMxxxxx_Collection_<timestamp>.jar

Standard WebSphere Application Server data, profile, and log collector. Produces the CRMxxxxx_Collection_<timestamp>.jar file.

To collect artifacts under a specific WebSphere Application Server profile rather than the default profile, add the parameter -profileName profile_name.

Windows

Command

Collects

  • Include regedit in the path
  • Run the following command from inside a temporary directory (outside of the WAS_HOME folder)

<WAS_HOME>\bin\collector.bat -verbosity 5 -JarOut CRMxxxxx_Collection_<timestamp>.jar

Standard WebSphere Application Server data, profile, and log collector. Produces the CRMxxxxx_Collection_<timestamp>.jar file.

To collect artifacts under a specific WebSphere Application Server profile rather than the default profile, add the parameter -profileName profile_name.

WebSphere thread dump

Procedure

To generate a WebSphere thread dump, go to <WAS_IS_PROFILE>\bin folder (e.g. C:\IBM\WebSphere\AppServer\profiles\default\bin) in a command window and invoke the following commands:

wsadmin
set objectName [$AdminControl queryNames WebSphere:type=JVM,process=server1,*]
$AdminControl invoke $objectName dumpThreads
quit

The thread dump file will be in the WebSphere profile directory <WAS_IS_PROFILE> and will have a name like javacore.xxxxxxxx.xxxxxx.xxxx.txt
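On Linux, UNIX, and other platforms where WebSphere Application Server runs on the IBM JDK, an equivalent javacore can also be produced by sending signal 3 (SIGQUIT) to the application server process; a sketch (the grep pattern and PID are illustrative):

    ps -ef | grep server1     # find the application server process ID
    kill -3 <server1_pid>     # writes a javacore*.txt file to the profile directory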

 

5.1 Front end dispatcher: HTTP server and WebSphere Application Server plugin

Return to Index

A WebSphere Application Server clustered environment (InfoSphere Information Server versions 8.5 and above) is usually configured with a front-end dispatcher such as an HTTP server, and a WebSphere HTTP server plugin.

Files and Data to collect Description

<WAS_HTTP_SERVER>/conf/httpd.conf
<WAS_HTTP_SERVER>/logs/*

WebSphere HTTP server config file and logs

<WAS_HTTP_PLUGIN>/config/*/plugin-cfg.xml
<WAS_HTTP_PLUGIN>/logs/*/*.log
WebSphere HTTP server plugin config file and logs

 

6. Information Server installation files to collect

Return to Index

AIX, Solaris, Linux, or HP-UX

Files and Data to collect

Description

<TEMP>/ibm_is_logs

Information Server installation logs

<TEMP>/log.txt

WebSphere Application Server installation log

<IS_HOME>/logs

Information Server installation logs

<IS_HOME>/Server/StagingArea/Log

Component Installer logs

<IS_HOME>/response.txt

Response file used or generated by the Information Server installation

Information Server Patches history  
<IS_HOME>/Updates/<patchName>/Previous.Version.xml
<IS_HOME>/Updates/<patchName>/Customer.Patchmanifest.xml
Version.xml and Patch Manifest before a patch was installed
Information Server In Progress Install history  
<IS_HOME>/_uninstall/inprogress/*.* Files generated by an in-progress installation

Windows

Files and Data to collect

Description

<TEMP>\ibm_is_logs

Information Server installation logs

<TEMP>\log.txt

WebSphere Application Server installation log

<IS_HOME>\logs

Information Server installation logs

<IS_HOME>\Server\StagingArea\Log

Component Installer logs

<IS_HOME>\response.txt

Response file used or generated by the Information Server installation

Information Server Patches history  
<IS_HOME>\Updates\<patchName>\Previous.Version.xml
<IS_HOME>\Updates\<patchName>\Customer.Patchmanifest.xml
Version.xml and Patch Manifest before a patch was installed
Information Server In Progress Install history  
<IS_HOME>\_uninstall\inprogress\*.* Files generated by an in-progress installation

 

7. DataStage (DS) information and files to collect

Return to Index

AIX, Solaris, Linux, or HP-UX

Files and Data to collect

Description

Server Machine

 

<USER_HOME>/ds_logs/*.*

Not always present.

DataStage SERVER Machine

 

<CRED_USER_HOME>/ds_logs/*.*

All files contained in the ds_logs directory.

<IS_HOME>/Server/DSEngine/dsenv

 

<IS_HOME>/Server/DSEngine/uvconfig  
<IS_HOME>/Server/DSEngine/.odbc.ini
or the file pointed by the environment variable ODBCINI
ODBC.ini file
<IS_HOME>/Server/DSEngine/errlog  
<IS_HOME>/Server/DSEngine/DSAuditTrace.log  
<IS_HOME>/Server/PXEngine/etc/jobmon_ports  
<IS_HOME>/Server/PXEngine/java/JobMonApp.log.nnn All historical logs
<IS_HOME>/Server/PXEngine/java/JobMonApp.log Current log
<IS_HOME>/Server/PXEngine/etc/operator.apt*  

<IS_HOME>/Server/DSEngine/orbtrc.<DATE>.txt
<IS_HOME>/Server/DSEngine/orbmsg.<DATE>.txt

Not always present

<IS_HOME>/Server/DSEngine/bin/orbtrc.<DATE>.txt
<IS_HOME>/Server/DSEngine/bin/orbmsg.<DATE>.txt

Not always present

<IS_HOME>/Server/Projects/_Project_Name_/&COMO&/*.*

Only present if tracing is enabled in the Administration client

<IS_HOME>/Server/Projects/_Project_Name_/heapdump.*
<IS_HOME>/Server/Projects/_Project_Name_/javacore.*
<IS_HOME>/Server/Projects/_Project_Name_/Snap.*
<IS_HOME>/Server/Projects/_Project_Name_/orbtrc.<DATE>.txt
<IS_HOME>/Server/Projects/_Project_Name_/orbmsg.<DATE>.txt
Heapdump, java core, and orb log files in the project

Windows

Files and Data to collect

Description

Client Machine

 

 <USER_HOME>\ds_logs\*.*

DataStage logs from the client user folders

 <IS_HOME>\Clients\Classic\orbtrc.<DATE>.txt

Not always present

 <IS_HOME>\Clients\Classic\orbmsg.<DATE>.txt

Not always present

 <IS_HOME>\Clients\Classic\javacore* Not always present
 <IS_HOME>\Clients\Classic\Snap*.trc Not always present
<USER_HOME>\Application Data\IBM\Information Server\DataStage Client\
<ClientTagID>\ErrorReports\*.zip
The ClientTagID is located in the <IS_HOME>\Version.xml file,
for example:
<InstallType client="true"
clientTagId="652e77e8-ea1e-4850-a8b2-11e6a33dd884"
currentVersion="8.5.0.0" document="false" domain="true"
engine="true" repository="true"/>

Server machine

 

 <USER_HOME>\ds_logs\*.*

Not always present.

DataStage SERVER Machine

 

<CRED_USER_HOME>\ds_logs\*.*

Credential-mapped user logs

<IS_HOME>\Server\DSEngine\uvconfig  
<IS_HOME>\Server\DSEngine\.odbc.ini  
<IS_HOME>\Server\DSEngine\javacore*
<IS_HOME>\Server\DSEngine\heapdump*
Java Core and heap dump files, created by OutOfMemoryErrors: there may be several, collect the most recent ones
<IS_HOME>\Server\PXEngine\etc\jobmon_ports  
<IS_HOME>\Server\PXEngine\java\JobMonApp.log.nnn All historical logs
<IS_HOME>\Server\PXEngine\java\JobMonApp.log Current log
<IS_HOME>\Server\etc\operator.apt*  

<IS_HOME>\Server\DSEngine\orbtrc.<DATE>.txt
<IS_HOME>\Server\DSEngine\orbmsg.<DATE>.txt

Not always present

<IS_HOME>\Server\DSEngine\bin\orbtrc.<DATE>.txt
<IS_HOME>\Server\DSEngine\bin\orbmsg.<DATE>.txt

Not always present

<IS_HOME>\Server\projects\_Project_Name_\&COMO&\*.*

Only present if tracing is enabled in the Administration client

<IS_HOME>\Server\Projects\_Project_Name_\heapdump.*
<IS_HOME>\Server\Projects\_Project_Name_\javacore.*
<IS_HOME>\Server\Projects\_Project_Name_\Snap.*
<IS_HOME>\Server\Projects\_Project_Name_\orbtrc.<DATE>.txt
<IS_HOME>\Server\Projects\_Project_Name_\orbmsg.<DATE>.txt
Heapdump, java core, and orb log files in the project

DataStage Operations Console (Information Server 8.7 and above only)

Files and Data to collect

Description

<IS_HOME>/Server/DSODB/*.cfg
<IS_HOME>/Server/DSODB/logs/AppWatcher*.log
<IS_HOME>/Server/DSODB/logs/JobRuntime.err
<IS_HOME>/Server/DSODB/logs/JobRuntime.log
<IS_HOME>/Server/DSODB/logs/EngMonApp*.log
<IS_HOME>/Server/DSODB/logs/ResMonApp*.log
<IS_HOME>/Server/DSODB/logs/handler*.log
<IS_HOME>/Server/DSODB/logs/odbqueryapp*.log
<IS_HOME>/Server/DSODB/logs/odbqapp*.log
<IS_HOME>/Server/PXEngine/DSResourceTracker/Tracker*.log
<IS_HOME>/ASBNode/asbagent_startup.out
<IS_HOME>/ASBNode/asbagent_startup.err
<IS_HOME>/ASBNode/bin/Agent.out
<IS_HOME>/ASBNode/bin/LoggingAgent.out
<WAS_IS_PROFILE>/logs/datastage_console_web.log
<WAS_IS_PROFILE>/logs/datastage_api_restservices.log
<WAS_IS_PROFILE>/logs/datastage_api_services.log

Log and configuration files used by DataStage Operations Console

<IS_HOME>/Server/DSODB/events/<YYYYMMDDHH_date>/*

Collect all files inside the most recent folder. Folder name is in YYYYMMDDHH format, for example '2011092813'

DataStage Operational Metadata (Information Server 11.3 and above only)

Files and Data to collect

Description

<IS_HOME>/Server/DSOMD/logs/*.log

Log files used by DataStage Operational Metadata

<IS_HOME>/Server/DSOMD/xml/*.xml*

XML files used by DataStage Operational Metadata

Collecting DataStage specific information

DataStage can create logging and tracing information at multiple layers and in different components:

  1. Client-Server tracing

  2. Client-JuggerNet Proxy Tracing

  3. Client-Spy Tracing

  4. Increase Log Detail

  5. Trace File Location

DataStage Client-Server Tracing

This traces calls made from the DataStage clients (such as Designer) to the DSServer engine helper programs. Tracing can be turned on at the project level. When tracing is turned on, all clients that attach to that project produce a large trace file. Also, each time a client attaches to DataStage, a warning message displays indicating that tracing is turned on. Click OK  to proceed.

Command

  1. Start up Administrator.

  2. Navigate to Project > Properties > Tracing.

  3. Ensure that Server Side Tracing is enabled.

  4. Open the Designer. A message displays that tracing is turned on. Click OK.

  5. Replicate the steps in the application that produce the error that you want to report to IBM Support

  6. Close the Designer.

  7. In the Administrator, uncheck the Server Side Tracing flag again (so you do not leave it on by mistake).

  8. Ensure that the "&COMO&" directory under the project directory on the server (for example, C:\IBM\InformationServer\Server\Projects\yourproject\&COMO&) now contains a DSR_TRACE_<username>-<number> file.

Files and Data to collect
<IS_HOME>\Server\projects\_Project_Name_\&COMO&\*.*

 

DataStage Client-JuggerNet Proxy Tracing

Use the JuggerNet Proxy Tracing to detect Codemesh problems. JuggerNet Proxy Tracing is set on a per-client basis.

Command

Set the following environment variables:
    XMOG_TRACE_LEVEL=TraceVerbose
    XMOG_TRACE_FILE=C:\xmogtrace.txt
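For example, on a Windows client you can set the variables for a single command-prompt session and start the client from that prompt, or set them system-wide (via System Properties) so that clients started from the Start menu pick them up; a sketch:

    set XMOG_TRACE_LEVEL=TraceVerbose
    set XMOG_TRACE_FILE=C:\xmogtrace.txt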

Files and Data to collect

C:\xmogtrace.txt

 

DataStage Client-Spy Tracing

The Spy trace shows calls made from the Java layer of the client. It is turned on for individual clients.

Command

  1. Create a file named newrepos.debug.properties that contains the following property:  
    NewRepos.spy.trace=true

  2. Save the file on the client machine in the  <IS_HOME>\ASBNode\conf  folder.

The logs are written to <USER_HOME>\ds_logs\dstage_wrapper_spy_NN.log where NN is a number from 1 to 20.

Files and Data to collect

<USER_HOME>\ds_logs\dstage_wrapper_spy_NN.log     where NN is a number from 1 to 20

 

Increase Log Detail

Increase the detail included in the client-side log files.

Command

Create a file named newrepos.debug.properties that contains the following properties:  

  • log4j.logger.com.ascential.dstage.impactanalysis=DEBUG

  • log4j.logger.com.ascential.dstage.advancedfind=DEBUG

  • log4j.logger.com.ascential.dstage.proxies=DEBUG

  • log4j.logger.com.ibm.datastage.repository=DEBUG

Files and Data to collect

<USER_HOME>\ds_logs\*.*

 

DataStage Trace File Location

For InfoSphere Information Server 8.5 through 9.1.x, trace files may be generated in the current folder when some DataStage tools are run. If present, these trace files should be included in the data gathered for IBM Support.

Command

The following commands may generate log files and artifacts to collect:
    DSXImportService
    dsjobs
    ISTools

Files and Data to collect

orbtrc.<DATE>.txt from the current folder

orbmsg.<DATE>.txt from the current folder

 

8. QualityStage (QS) information and files to collect

Return to Index

For server job run problems

Commands

  1. Export the job that is failing or has produced unexpected output (including rule sets and match specifications, if used) to a .dsx file.

  2. Provide the input to the above job.

  3. Extract the DS log for the above job (from DS Director).

  4. Provide the QS job artifacts using the following method:
    - Ascertain the DS job number by using DS Director > View > Detail (Job no.).
    - Provide the contents of the
    <IS_HOME>\Server\Projects\{ProjectName}\RT_QS{JobNo} folder.

  5. If the job is run in multi-node, collect information about the number of nodes that are used.

For GUI problems

Commands

  1. Export the job that is failing or has produced unexpected output (including rule sets and match specifications, if used) to a .dsx file.
  2. Provide screenshots of any error messages.
  3. Provide a replication scenario.

QualityStage Standardization Rules Designer (Information Server 9.1)

Files and Data to collect

Description

<IS_HOME>/Server/Projects/{ProjectName}/RT_QS{JobNo}/*

Collect all files inside the project folder and subfolder with the Job number

<TEMP>/RulesConsoleCache/*

Only on the Engine tier. Collect all files contained in this directory and its subdirectories. They are QualityStage standardized rule-set control files used within QualityStage Standardization Rules Designer

 

9. Information Analyzer (IA) information and files to collect

Return to Index

All platform users

Files and Data to collect

Description

<WAS_IS_PROFILE>/iasHandler.log

Records agent activity

<WAS_IS_PROFILE>/iasServer.log
<WAS_IS_PROFILE>/iasServer-server*.log
<WAS_IS_PROFILE>/iasHandler-server*.log
<WAS_IS_PROFILE>/ascl-customattributes-server*.log
<WAS_IS_PROFILE>/datastage_api_services-server*.log
<WAS_IS_PROFILE>/xmeta-server*.log

Records IA service activity

Client - Windows only  
<USER_HOME>\Local Settings\Application Data\IBM\IBM Information Server Console\isc.log

Windows 7 or higher, and Windows 64 bits  (Note: folders under the users' directory may be hidden):
<USER_HOME>\AppData\Local\IBM\IBM Information Server console\isc.log
or
%USERPROFILE%\AppData\Local\IBM\IBM Information Server console\isc.log
Records the Information Server Console (ISC.exe) client activity
<USER_HOME>\Local Settings\Application Data\IBM\IBM Information Server Console\session.log

Windows 7 or higher, and Windows 64 bits (Note: folders under the users' directory may be hidden):
<USER_HOME>\AppData\Local\IBM\IBM Information Server Console\session.log
or
%USERPROFILE%\AppData\Local\IBM\IBM Information Server Console\session.log

Records the Information Server Console (ISC.exe) client activity for the session

<IS_HOME>\Clients\ISC\orbtrc.<DATE>.txt

Not always present

<IS_HOME>\Clients\ISC\orbmsg.<DATE>.txt

Not always present

<IS_HOME>\Clients\ISC\javacore* Not always present
   
Client Tier - Windows only  
  <IS_HOME>\ASBNode\lib\java\iaapi_buildinfo_client.txt Build Information
Server - Services Tier  
  <WAS_HOME>\profiles\InfoSphere\installedApps\<NodeCell>\IAServices.ear.ear\META-INF\iaapi_buildinfo_services.txt  
Server - Engine Tier  
  <IS_HOME>\ASBNode\lib\java\iaapi_buildinfo_engine.txt  

Collecting Information Analyzer specific information

AIX, Solaris, Linux, or HP-UX

Command

Collects

<IS_HOME>/ASBServer/bin/ServiceAdmin.sh -user ISadmin -password ISadminpass -ls

Installed services

<IS_HOME>/ASBServer/bin/ReportingAdmin.sh -user ISadmin -password ISadminpass -t -li

Installed report templates

Windows

Command

Collects

<IS_HOME>\ASBServer\bin\ServiceAdmin.bat -user ISadmin -password ISadminpass -ls

Installed services

<IS_HOME>\ASBServer\bin\ReportingAdmin.bat -user ISadmin -password ISadminpass -t -li

Installed report templates

 

10. Information Governance Catalog (IGC) [Business Glossary] information and files to collect

Return to Index

In IBM InfoSphere Information Server version 11.3, the Business Glossary components were combined and renamed to Information Governance Catalog.

All platform users.

Files and Data to collect

Description

Server machine - Information Server 8.7 and above  
<WAS_IS_PROFILE>/logs/bgapi*.log Service logs
<WAS_IS_PROFILE>/logs/bg*.log  

Server machine -  Information Server 11.3 and above, using WAS ND

 

<WAS_IS_PROFILE>/logs/igc.log* Information Governance Catalog log files
<WAS_IS_PROFILE>/logs/igc-admin-services.log* Information Governance Catalog log files
<WAS_IS_PROFILE>/logs/messages.log Information Governance Catalog log file
<WAS_IS_PROFILE>/logs/omd-importer.log* Operational MetaData log files

Server machine -  Information Server 11.3 and above, using WAS Liberty

 

<WAS_IS_PROFILE>/iis/gov.xml WAS Liberty configuration file
<WAS_IS_PROFILE>/iis/110gov/iis-igc-roles.jar
<WAS_IS_PROFILE>/iis/110gov/iis-bdc-roles.jar
Role definition file

Business Glossary Anywhere Client (Information Server 8.5-9.1)

Files and Data to collect

Description

Server

 

<WAS_IS_PROFILE>/logs/GlossaryBrowser.log Business Glossary Browser and Business Glossary Anywhere application logs
<WAS_IS_PROFILE>/installedApps/<Node01Cell>/GlossaryBrowser.ear.ear/BGWeb.war/WEB-INF/classes/resources/build.info Business Glossary Browser and Business Glossary Anywhere Server version number
Client  
<USER_HOME>\Application Data\IBM\BusinessGlossaryAnywhere\user.config

Windows 7 or higher, and Windows 64 bits:
<USER_HOME>\AppData\Roaming\IBM\BusinessGlossaryAnywhere\user.config

Client specific configurations
<USER_HOME>\Local Settings\Application Data\IBM\BusinessGlossaryAnywhere\Logs\BusinessGlossaryAnywhere<yyyymmdd>.<hhmmss>.log

Windows 7 or higher, and Windows 64 bits:
<USER_HOME>\AppData\Roaming\IBM\BusinessGlossaryAnywhere\Logs\BusinessGlossaryAnywhere<yyyymmdd>.<hhmmss>.log

<USER_HOME> is the home folder of the user under which Business Glossary Anywhere is installed.

 

11. Information Services Director (ISD) information and files to collect

Return to Index

All platform users

Files and Data to collect

Description

Server

 

<IS_HOME>/ASBNode/conf/*

 

   
Client - Windows only  
<USER_HOME>\Local Settings\Application Data\IBM\IBM Information Server Console\isc.log

Windows 7 or higher, and Windows 64 bits
(Note: folders under the users' directory may be hidden):
<USER_HOME>\AppData\Local\IBM\IBM Information Server console\isc.log
or
%USERPROFILE%\AppData\Local\IBM\IBM Information Server console\isc.log

Records the Information Server Console (ISC.exe) client activity
<USER_HOME>\Local Settings\Application Data\IBM\IBM Information Server Console\session.log

Windows 7 or higher, and Windows 64 bits
(Note: folders under the users' directory may be hidden):

<USER_HOME>\AppData\Local\IBM\IBM Information Server console\session.log
or
%USERPROFILE%\AppData\Local\IBM\IBM Information Server console\session.log

Records the Information Server Console (ISC.exe) client activity for the session

 <IS_HOME>\Clients\ISC\orbtrc.<DATE>.txt

Not always present

 <IS_HOME>\Clients\ISC\orbmsg.<DATE>.txt

Not always present

Collecting Information Services Director specific information

AIX, Solaris, Linux, or HP-UX

Command

Collects

<IS_HOME>/ASBNode/bin/AgentConfig.sh -user ISadmin -password ISadminpass -listAllAgents

List of all registered agents

<IS_HOME>/ASBNode/bin/AgentConfig.sh -user ISadmin -password ISadminpass -listAllRouters

List of all registered routers

Windows

Command

Collects

<IS_HOME>\ASBNode\bin\AgentConfig.bat -user ISadmin -password ISadminpass -listAllAgents

List of all registered agents

<IS_HOME>\ASBNode\bin\AgentConfig.bat -user ISadmin -password ISadminpass -listAllRouters

List of all registered routers

 

12. FastTrack (FT) information and files to collect

Return to Index

Windows only users

Files and Data to collect

Description

Client Machine, Information Server 8.5

 

<USER_HOME>\fasttrack-workspace_8.5\.metadata\.log Client Eclipse log file

Client Machine, Information Server 8.7

 

<USER_HOME>\fasttrack-workspace_8.7\.metadata\.log Client Eclipse log file

Client Machine, Information Server 9.1 and above

 

<USER_HOME>\fasttrack-workspace\.metadata\.log Client Eclipse log file

 

13. Reporting framework information and files to collect

Return to Index

The Reporting framework is part of the Information Services Framework (ISF) common services. It provides life-cycle management for all reporting artifacts, such as deployment, report creation and modification, report execution and scheduling, and viewing of report results.

All platform users

Files and Data to collect

Description

<WAS_IS_PROFILE>\logs\*.*

All the WebSphere Application Server log files

<WAS_IS_PROFILE>\javacore.XXX.txt
<WAS_IS_PROFILE>\heapdump.XXX.phd

Java Core and heap dump files created by  OutOfMemoryErrors: there may be several, collect the most recent ones

   

SERVER Machine and
WebSphere Application Server Cluster Nodes

 

<JREPORT_HOME>\logs\*.*
<JREPORT_HOME>\bin\*.*

JReport log files.
Note: In a WebSphere Application Server clustered environment, JReport log files may be found on each remote machine where a cluster node resides.

 

Collecting Reporting Framework specific information

AIX, Solaris, Linux, or HP-UX

Command

Collects

ulimit -a > ulimit.out Captures the current ulimit settings.

 

14. PX Engine information and files to collect

Return to Index

PX configuration files
The PX configuration file describes the machine resources available to the PX engine, including the number of nodes (physical and virtual) and the scratch and permanent disk resources.

A default configuration file is created at installation time, located at:
<IS_HOME>/Server/Configurations/default.apt

A job can override this configuration file location by setting the environment variable APT_CONFIG_FILE (recommended by the ACG Best Practices). This allows the file to be placed anywhere in the file system. However, most installations store configuration files in IS_HOME/Server/Configurations. Since configuration files are small, collect all of the files in IS_HOME/Server/Configurations.
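For reference, a minimal two-node configuration file has the following shape (an illustrative sketch only; the host name and resource paths are placeholders, not your actual values):

    {
      node "node1"
      {
        fastname "myhost"
        pools ""
        resource disk "/opt/IBM/InformationServer/Server/Datasets" {pools ""}
        resource scratchdisk "/opt/IBM/InformationServer/Server/Scratch" {pools ""}
      }
      node "node2"
      {
        fastname "myhost"
        pools ""
        resource disk "/opt/IBM/InformationServer/Server/Datasets" {pools ""}
        resource scratchdisk "/opt/IBM/InformationServer/Server/Scratch" {pools ""}
      }
    }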


Exported job
The Designer allows a job (such as a failing job) to be exported to a .dsx file. IBM Support uses this file to understand the problem or reproduce it. You must specify the name of the .dsx file at the time of the export.


Director log
The log for a failing job should be printed to a file using the Director's log print dialog. You must specify the name of the file to be produced when you print the log. The "detailed" option must be specified.


Debugging directory
If a job fails consistently, and you can edit the job, collect a Debugging directory for the job. To create this directory:

Commands

  1. Set the following environment variables:
    DS_PXDEBUG=1
    APT_DUMP_SCORE=1
  2. Re-run the job
Files and Data to collect
A directory is produced in IS_HOME/<project>/Debugging/<job name>. This directory contains every artifact created in the process of running the job, including:
  • osh script for the job
  • osh command used to run the osh script
  • Script that invokes the osh
  • Settings of all of the environment variables for the job
  • Job parameter file
  • PX configuration file
  • Transforms used by the job:
    - Source code for each transform
    - osh script used to compile each transform, including the input and output schemas
    - Compiled object code for each transform
  • Raw job log

Create an archive of this directory using either tar (UNIX) or zip (Windows) and send it to IBM Support.
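For example, on UNIX (the project and job names are placeholders):

    cd IS_HOME/<project>/Debugging
    tar -cvf <job_name>_debug.tar "<job name>"
    gzip <job_name>_debug.tar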

Collecting PX Engine specific information

AIX, Solaris, Linux, or HP-UX

Command

Collects

ls -ltr IS_HOME/Server/PXEngine/lib

File sizes and modification dates of the PX Engine shared libraries.
strings IS_HOME/Server/PXEngine/lib/*.so | grep '\$Version' | sort -u Lists the embedded compile date, time and compilation host name. This is useful for verifying patch levels

c++ -v

Version information about the C++ compiler

 

15. XMeta

Return to Index

All platform users

This section describes problem scenarios that can occur when using XMeta. For each scenario, the typical characteristics or symptoms are described and a list of Must Gather information is provided.

Scenario: Database Deadlock

A database deadlock occurs when two (or more) database transactions each hold a resource that the other is attempting to acquire. Database deadlocks are automatically detected by the database and at least one of the transactions involved in the deadlock is terminated. The JDBC driver returns an exception in response to the termination. XMeta detects this exception and returns a DatabaseDeadlockException to the application.
It is important to recognize that in any non-trivial, multi-user database-based system, deadlocks will occur. While XMeta makes every effort to reduce the likelihood of deadlocks, there are certain scenarios where deadlocks cannot be avoided.
Characteristics

The most obvious symptom of a database deadlock is that XMeta will throw a DatabaseDeadlockException. This exception is logged in SystemOut.log or SystemErr.log. A message from DB2 is similar to:

Caused by: com.ibm.db2.jcc.c.SqlException: DB2 SQL error: SQLCODE: -911, SQLSTATE: 40001, SQLERRMC: 2

The database deadlock is recognized by the SQLCode: -911 and SQLERRMC: 2. If the SQLERRMC is a value other than 2, then this is not a true deadlock. It could be a lock timeout (SQLERRMC: 68 - see the following section).

In addition to the messages in the log file, the client application will report a failing operation. Some clients will display the error to the user (for example, DataStage Designer), some clients will have automatic retry logic (for example, IA), and some clients will show the same exception to their clients (for example, ISF services).
Files and Data to collect
  1. All WebSphere Application Server logs
  2. Database configuration settings
  3. DB2 diagnostic log
  4. Database deadlock detail
  5. Database statement snapshot
  6. Client logs for any client who sees a deadlock error.

 

Scenario: Database Lock Timeout

A database lock timeout occurs when a transaction is waiting to acquire a lock on a resource and another transaction is holding the lock. If the waiting transaction waits for too long, the database will abort the transaction. Some databases (for example, DB2) report this situation using the same error codes as a deadlock. As a result, XMeta interprets this as a deadlock and returns a DatabaseDeadlockException.
Characteristics
The database lock timeout symptoms are similar to the deadlock symptoms. The only significant difference from the deadlock scenario is that the reason code SQLERRMC is 68 (instead of 2):

Caused by: com.ibm.db2.jcc.c.SqlException: DB2 SQL error: SQLCODE: -911, SQLSTATE: 40001, SQLERRMC: 68

Since XMeta throws a DatabaseDeadlockException, the client responses to a lock timeout are the same as a deadlock (see the previous section).
Files and Data to collect
  1. All WebSphere Application Server logs
  2. Database configuration settings
  3. DB2 diagnostic log
  4. Database statement snapshot
  5. If the lock timeouts are a consistent problem, also capture the following information:
    1. A series of database lock snapshots taken at semi-regular intervals when the lock timeouts occur.
    2. A database application snapshot for any applications that are shown in the UOW Waiting state in the lock snapshots.
  6. Client logs for any client who sees a deadlock error.

 

Scenario: Lock-up

A lock-up is a situation where client applications freeze and users are unable to do anything. This can be one client application, or it can be every client application using the server.
Characteristics
The primary characteristic of this problem is a client application that stops responding. The application may show an hourglass. The application either never resumes or resumes only after a long period of time (for example, 1 - 2 hours).
Files and Data to collect
  1. All WebSphere Application Server logs
  2. Database configuration settings
  3. DB2 diagnostic log
  4. Database statement snapshot
  5. A series of database lock snapshots taken at semi-regular intervals when the lock timeouts occur.
  6. A database application snapshot for any applications that are shown in the UOW Waiting state in the lock snapshots.
  7. Client logs for any client that was active at the time of the lock up.


Collecting XMeta specific information

Files and Data to collect

Description

ASBServer\bin\imam_upgrade.log XMeta log file
ASBServer\bin\xmetaAdmin*.log XMeta log file
ASBServer\conf\database.properties XMeta properties file
ASBServer\conf\imam_staging_repository.properties XMeta properties file
DB2 Configuration and logs

Description

In a DB2 prompt, run the commands:

  1. get dbm cfg show detail
  2. get db cfg for <db-name> show detail
DB2 Configuration Settings
[DB2_HOME]\<DatabaseInstance>\db2diag.log DB2 diagnostic log: for example C:\IBM\SQLLIB\DB2\db2diag.log
Deadlock details
DB2 automatically creates a deadlock monitor at database creation time. If this monitor is still active, then the deadlock detail files can be found in this location:
<DFTDBPATH>/<DatabaseInstance>/NODE0000/SQL00001/db2event/db2detaildeadlock

If this monitor was dropped or deactivated, then create a new monitor in a DB2 command-line window as follows:
create event monitor deadlock_mon for connections, deadlocks with details write to file '<output-directory>' buffersize 8 blocked maxfiles 1 maxfilesize none
 
This monitor can be turned on and off as follows:
// Turn the monitor ON
set event monitor deadlock_mon state 1
 
// Turn the monitor OFF
set event monitor deadlock_mon state 0

 
After the deadlock occurs, extract the event details in text format as follows:

db2evmon -path <event-output-directory>

Pipe the output of this command to a file to save the results in text format.
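For example, assuming the monitor wrote its output to /tmp/dlmon:

    db2evmon -path /tmp/dlmon > deadlock_details.txt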

Monitor switches
The monitor logs are most useful if certain DB2 switches are enabled. These switches can be enabled or disabled in a DB2 command line prompt as follows:

DB2 UPDATE DBM CFG USING DFT_MON_UOW ON IMMEDIATE
DB2 UPDATE DBM CFG USING DFT_MON_STMT ON IMMEDIATE
DB2 UPDATE DBM CFG USING DFT_MON_TABLE ON IMMEDIATE
DB2 UPDATE DBM CFG USING DFT_MON_BUFPOOL ON IMMEDIATE
DB2 UPDATE DBM CFG USING DFT_MON_LOCK ON IMMEDIATE
DB2 UPDATE DBM CFG USING DFT_MON_SORT ON IMMEDIATE
DB2 UPDATE DBM CFG USING DFT_MON_TIMESTAMP ON IMMEDIATE
DB2 UPDATE DBM CFG USING HEALTH_MON ON IMMEDIATE

Restart DB2 for these to take effect:

db2stop force
db2start

Lock Snapshot
Run the following commands in a DB2 command line prompt:
db2 connect to [db-name]
db2 get snapshot for locks on [db-name] > [output-file-name]

If necessary, reset the statistics, re-run the test code, and capture another snapshot. You can reset the statistics as follows:
db2 reset monitor for database [db-name]

In some cases, it may be useful to run more test code and capture another snapshot without resetting the statistics.
Statement Snapshot
Run the following commands in a DB2 command line prompt:
db2 connect to [db-name]
db2 get snapshot for dynamic sql on [db-name] > [output-file-name]


It is useful to reset statistics and/or capture multiple snapshots.
Application Snapshot
In some cases, you need to capture details about a particular connection to the database. DB2 refers to a connection as an application. First, identify the application ID. For effective debugging, in most cases where you need an application snapshot, you also need a lock snapshot. Take the lock snapshot first, identify which applications are in the UOW Waiting state, then take the application snapshots for those applications. In the lock snapshot, the application ID is referred to as the "Application handle":

Application handle                         = 175
Application ID                             = *LOCAL.db2ieimd.081008181114
Sequence number                            = 00001
... 

Run the following commands in a DB2 command line prompt:
db2 get snapshot for application applid [appl-id]
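For example, using the sample Application ID shown above and saving the output to a file:

    db2 get snapshot for application applid *LOCAL.db2ieimd.081008181114 > appl_snapshot.txt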
Statement Monitor
A statement monitor captures all of the individual statements that are executed against the database, in the order in which they were executed. The steps are similar to the deadlock monitor.

Run the following DB2 commands and procedures:

  1. Create the monitor:
    db2 create event monitor statement_mon for statements write to file '[output-directory]' buffersize 8 blocked maxfiles 1 maxfilesize none
     

  2. Turn on the monitor:
    db2 set event monitor statement_mon state 1
     

  3. Run the test code
     

  4. Turn off the monitor. This should create a (binary) .evt file
    db2 set event monitor statement_mon state 0
     

  5. Convert the file to text
    db2evmon -path [event-directory] > statements.txt

 

16. XMeta Metabrokers and Bridges

Return to Index

All platform users

Files and Data to collect

Description

Services (Domain) machine

 

<IS_HOME>\ASBServer\install\logs\xmeta-install.log

xmeta installation log

<IS_HOME>\ASBServer\install\etc\history\xmeta-install-status.hist

xmeta installation history file

<IS_HOME>\ASBServer\install\bin\install.properties

xmeta installation configuration file

 
Server (i.e., DS engine) machine

 

<IS_HOME>\ASBNode\install\logs\xmeta-install.log

xmeta installation log

<IS_HOME>\ASBNode\install\etc\history\xmeta-install-status.hist

xmeta installation history file

<IS_HOME>\ASBNode\install\bin\install.properties

xmeta installation configuration file


Client machine

 

<IS_HOME>\ASBNode\install\logs\xmeta-install.log

xmeta installation log

<IS_HOME>\ASBNode\install\etc\history\xmeta-install-status.hist

xmeta installation history file

<IS_HOME>\ASBNode\install\bin\install.properties

xmeta installation configuration file

Windows

Set environment variable MIR_LOG_LEVEL to 6. All the debug information for Bridges will be captured in the log files as described below:

Files and Data to collect

Description

Client Machine

 

<USER_HOME>\dstage_wrapper_trace*.log DataStage logs. <USER_HOME> is typically C:\Documents and Settings\<username>
<USER_HOME>\dstage_wrapper_spy*.log DataStage logs
<IS_HOME>\Clients\MetaBrokersAndBridges\IBM WebSphere Metadata Server MetaBroker\IBMWDISPostprocessor.log Metabrokers and Bridges post processor Log File
<IS_HOME>\Clients\MetaBrokersAndBridges\IBM WebSphere Metadata Server MetaBroker\IBMWDISPreprocessor.log Metabrokers and Bridges pre processor Log File
<IS_HOME>\Clients\MetaBrokersAndBridges\IBM WebSphere Metadata Server MetaBroker\IBMWDISDecoderParamValues.xml

<IS_HOME>\Clients\MetaBrokersAndBridges\IBM WebSphere Metadata Server MetaBroker\IBMWDISEncoderParamValues.xml
Metabrokers and Bridges parameter values files
<IS_HOME>\Clients\MetaBrokersAndBridges\IBM Rational Data Architect MetaBroker\IBMRDADecoderParamValues.xml Metabrokers and Bridges parameter values file
<IS_HOME>\Clients\MetaBrokersAndBridges\IBM Rational Data Architect MetaBroker\*.log Metabrokers and Bridges Log File
<IS_HOME>\Clients\MetaBrokersAndBridges\IBM Rational Data Architect MetaBroker\Installation directory Metabrokers and Bridges Log File
Client Machine, Information Server 8.7 - 9.1

 

<IS_HOME>\Clients\MetaBrokersAndBridges\IBM InfoSphere Data Architect MetaBroker\IBMIDADecoderParametersEnu.xml Metabrokers and Bridges parameter values file
<IS_HOME>\Clients\MetaBrokersAndBridges\IBM InfoSphere Data Architect MetaBroker\*.log Metabrokers and Bridges Log File
<IS_HOME>\Clients\MetaBrokersAndBridges\IBM WebSphere Metadata Server MetaBroker\IBMWDISDecoderParametersEnu.xml Metabrokers and Bridges parameter values file
<IS_HOME>\Clients\MetaBrokersAndBridges\IBM WebSphere Metadata Server MetaBroker\IBMWDISEncoderParametersEnu.xml Metabrokers and Bridges parameter values file
Client Machine, Information Server 11.3 and above

 

<IS_HOME>\Clients\MetaBrokersAndBridges\mis.log Metabrokers and Bridges Log File
<IS_HOME>\Clients\MetaBrokersAndBridges\Bridges*\conf\MIRSetup.xml Metabrokers and Bridges Setup File
<IS_HOME>\Clients\MetaBrokersAndBridges\Bridges*\conf\MIRModelBridges.xml Metabrokers and Bridges parameter values file
<IS_HOME>\Clients\MetaBrokersAndBridges\bin\mis.cfg Metabrokers and Bridges Configuration File
<TEMP>\*.xml Metabrokers and Bridges Configuration Files
<TEMP>\imam\**\*.xml Metabrokers and Bridges Configuration Files

To turn on additional debugging information in Metabrokers and Bridges, set the environment variable DEBUG_IBMWDIS to a user-defined string value, which will be used as a prefix in the filenames of the output log files, and then reproduce the problem. Additional debugging information will be produced in the following artifacts:

Files and Data to collect

Description

<TEMP>\<DEBUG_IBMWDIS>MIMB.log Metabrokers and Bridges Log File
<TEMP>\<DEBUG_IBMWDIS>XML_Decoder.xml_Dump.xml  
<TEMP>\<DEBUG_IBMWDIS>XML_Decoder.xml  
<TEMP>\<DEBUG_IBMWDIS>Filterhub*  
<TEMP>\<DEBUG_IBMWDIS>TempHub.hub  

As part of the collection for the XMeta component, also include the artifacts described in section 7, DataStage (DS) information and files to collect.

 

17. Information Server auditing files to collect

Return to Index

All platform users

Files and Data to collect

Description

Server machine

 

<WAS_IS_PROFILE>\logs\ISauditLog_0.log
<WAS_IS_PROFILE>\logs\ISauditLog_1.log

Audit log files (latest two instances) in the default location. See below for modified locations

Note:
The location and file names of the auditing files may have been modified. The path and filename can be determined by inspecting the
audit.file.path and audit.file.name keys in the ISauditing.properties file, which is pointed to by the auditing.config.file key in the <WAS_IS_PROFILE>/classes/isfconfig.properties file. These files also exist on each cluster member.

18. SAP packs

Return to Index

All platform users

Files and Data to collect

Description

Server (DataStage engine) machine

 

<IS_HOME>/Server/DSEngine/.dsrel DS Version
<IS_HOME>/Server/DSSAPbin/.dsSAPPackrel
Under Windows collect:
<IS_HOME>\Server\DSSAPbin\dsSAPPackrel.txt
R3 Pack version (if the file exists, then it is a valid R3 Pack installation)
<IS_HOME>/Server/DSBWbin/.dsSAPBWPackrel
Under Windows collect:
<IS_HOME>\Server\DSSAPbin\dsSAPBWPackrel.txt
BW Pack version (if the file exists, then it is a valid BW Pack installation). It is not present in BW Pack 4.3.1 (bug)
<IS_HOME>/Server/DSEngine/bin/IWSDSPSAPR3*.SYS2
<IS_HOME>/Server/DSEngine/bin/IWSDSPSAPBW*.SYS2
License files showing the installed R3/BW Pack version. Use the files as a flag for an existing installation, for example:
IWSDSPSAPR30600.SYS2 → R3 Pack 60
IWSDSPSAPBW0403.SYS2 → BW Pack 43

Contents of the folder indicated by the environment variable $RFC_TRACE_DIR

Folder where trace files are stored. Archive the entire folder. To enable trace generation, set the following environment variables before running the faulty operation. On the server, they should be set either in dsenv or in the Project settings using the DS Administrator:
RFC_TRACE=1
RFC_NO_COMPRESS=1
RFC_TRACE_DIR=folder_path

<IS_HOME>/Server/DSSAPConnections User data for the R3 Pack: archive the files included in this folder and only the files found within its subfolders. Do not collect files found in deeper folder levels, as they may contain sensitive user data

<IS_HOME>/Server/DSBWConnections

User data BW Pack: archive the entire folder
   

Client machine

 

Contents of the folder indicated by the environment variable $RFC_TRACE_DIR

Folder where trace files are stored. Archive the entire folder. To enable trace generation, set the following environment variables before running the faulty operation. On an Information Server client, set them at the Windows system level:
RFC_TRACE=1
RFC_NO_COMPRESS=1
RFC_TRACE_DIR=folder_path
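For example, a minimal sketch of enabling the RFC trace on the engine by appending to dsenv (the trace folder /tmp/rfc_trace is a hypothetical path; create it before running the job, and restart the DataStage engine so that dsenv changes take effect):

# Append to <IS_HOME>/Server/DSEngine/dsenv
RFC_TRACE=1; export RFC_TRACE
RFC_NO_COMPRESS=1; export RFC_NO_COMPRESS
RFC_TRACE_DIR=/tmp/rfc_trace; export RFC_TRACE_DIR

On a Windows client, set the same three variables as system environment variables (Control Panel > System > Advanced system settings > Environment Variables).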

 

19. Connectivity

Return to Index

19.1 ODBC Connector, ODBC Enterprise Stage, ODBC Stage

Use ODBC trace when a problem is encountered with a component that uses ODBC to access data sources, such as the ODBC Connector, the ODBC Enterprise stage, and the ODBC stage.

Since collection of ODBC trace can affect performance, enable ODBC trace before you re-run a job or repeat a task. Disable ODBC trace as soon as possible.

Problems with metadata import in Information Analyzer

If you are collecting an ODBC trace because of problems with metadata import in Information Analyzer, restart the ASB Agent service after enabling or disabling the trace. To do so, follow these instructions:

Windows

Command

Description

Click on Control Panel > Administrative Tools > Services.
Find the ASB Agent service and click Stop.
To restart the ASB Agent, click Start.

Restart the ASB Agent after enabling or disabling the trace

AIX, Solaris, Linux, or HP-UX

Command

Collects

Log on as root
cd <IS_HOME>/ASBNode/bin
To stop the ASB Agent, type:   ./NodeAgents.sh stop
To start the ASB Agent, type:   ./NodeAgents.sh start
To restart the ASB Agent, type:   ./NodeAgents.sh restart
 

Generate the ODBC Trace

Windows

Command

Description

  1. Click on Control Panel > Administrative Tools > ODBC Data Source Administrator
  2. Select the Tracing tab
  3. Specify a file name in the Log File Path
  4. Click Start Tracing Now
  5. Execute the DataStage job or perform the task that requires tracing
  6. Click Stop Tracing Now
  7. Collect the trace file from the specified location
Generate and collect the ODBC trace

AIX, Solaris, Linux, or HP-UX

Command

Description

  1. Log on as a DataStage administrator (for example, dsadm)
  2. cd $DSHOME
  3. Edit the .odbc.ini file
  4. Locate the [ODBC] section in .odbc.ini, which is typically at the bottom of the file.
    Modify the following properties to enable tracing:
    Trace=1
    TraceFile=Trace file location and name (for example, /tmp/odbctrace.txt)
  5. Execute the DataStage job or perform the task that requires tracing
  6. Edit .odbc.ini again, and set Trace=0
  7. Collect the trace file from the specified location
Generate and collect the ODBC trace
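For example, the [ODBC] section with tracing enabled might look like this (a sketch; leave any existing entries in the section unchanged, and note that /tmp/odbctrace.txt is an illustrative path):

[ODBC]
Trace=1
TraceFile=/tmp/odbctrace.txt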

 

19.2 Oracle Connector, Oracle Enterprise Stage, Oracle OCI Plugin, DRS Plugin

Return to Index

To provide IBM Support with sufficient information to debug a problem with Oracle Connectivity, the following are the minimal collection requirements that should be provided. These requirements are in addition to the general collection requirements discussed in Collecting Information Server information: minimal requirements.

DataStage job information

Follow the instructions in section 7. DataStage (DS) information and files to collect to collect DataStage job information. The following job data is needed:

Data to collect

Description

  • Stage type name
  • Job export
  • Detailed Job log export (Print to file to export detailed job log)

DataStage stage type name, job export and the job log

Database job information

Run the commands beginning with SQL> using an application capable of running SQL commands against the database; for example, use SQL*Plus to run the SQL commands on an Oracle client machine.
On UNIX, prefix environment variables with the $ character.
On Windows, surround environment variables with % characters.
The following tables use the UNIX syntax. Adjust the syntax if you are using Microsoft Windows.

Oracle

Command

Collects

SQL> select * from v$version;

Returns the version of the Oracle database.

SQL> select * from nls_database_parameters;

Returns NLS information for the Oracle database.

SQL> select dbms_metadata.get_ddl('<type>','<name>','<schema>') from dual;

Returns the object DDL. Run this command for all tables, views, indexes, and all objects associated with the DataStage job.

ls -Rl $ORACLE_HOME > oracle_client_files.txt

Returns a file listing of the Oracle client

$ORACLE_HOME/bin/sqlplus -version

Returns the version of the client
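For example, a hypothetical get_ddl call for a table named EMPLOYEES owned by the schema HR (the object names are illustrative):

SQL> select dbms_metadata.get_ddl('TABLE','EMPLOYEES','HR') from dual;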

DB2

Command

Collects
db2level Returns the version of the DB2 client or server.
db2look Returns the object DDL. Run this command for all tables, views, indexes, and all objects associated with the DataStage job.
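For example, a minimal db2look invocation that extracts the DDL for a single table (the database, schema, and table names are hypothetical):

db2look -d MYDB -e -z MYSCHEMA -t MYTABLE -o mytable_ddl.sql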

 

Oracle Enterprise Stage information

Use the following commands to gather and generate the files described above.

AIX, Solaris, Linux, or HP-UX

Commands

Description

ls -l <IS_HOME>/Server/DSComponents/bin > <TEMP>/dscomplist.txt

List of libraries used. Send the generated dscomplist.txt file.

. <IS_HOME>/Server/DSEngine/dsenv

Linux:
gdb `which osh` <IS_HOME>/Server/Projects/<project name>/<core_file_name>
(gdb) where

AIX: Same as Linux, but use the dbx command instead of gdb.

Core files produced by jobs will be found in the project folder.
Define the environment variable APT_NO_PM_SIGNAL_HANDLERS to allow core file generation. To get the stack trace from the core file, use dbx on AIX or gdb on Linux, and run the where command at the dbx or gdb prompt. Copy the stack trace shown and send it to IBM Support for analysis (see the sketch below).
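For example, a minimal sketch of extracting a stack trace from a core file on Linux (the project name myproj and core file name core.12345 are hypothetical):

. /opt/IBM/InformationServer/Server/DSEngine/dsenv
gdb `which osh` /opt/IBM/InformationServer/Server/Projects/myproj/core.12345
(gdb) where
(gdb) quit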

Windows

Commands

Description

dir /S <IS_HOME>\Server\DSComponents\bin > <TEMP>\dscomplist.txt

List of libraries used. Send the generated dscomplist.txt file.

 

Oracle OCI Plugin

AIX, Solaris, Linux, or HP-UX

Commands

Description

ls -l <IS_HOME>/Server/DSComponents/bin > <TEMP>/dscomp_ocilist.txt

List of libraries used. Send the generated dscomp_ocilist.txt file.

. <IS_HOME>/Server/DSEngine/dsenv
ldd <IS_HOME>/Server/DSComponents/bin/oraoci*

Library dependency check. Source dsenv, and then run ldd against each Oracle OCI library.

Windows

Commands

Description

dir /S <IS_HOME>\Server\DSComponents\bin\oraoci*dll > <TEMP>\dscomp_ocilist.txt

List of libraries used. Send the generated dscomp_ocilist.txt file.

set PATH=<Path value from job log>

<PATH_TO_DEPENDS.EXE>\depends.exe <IS_HOME>\Server\DSComponents\bin\oraoci*dll

Library link check. Set the environment variable PATH equal to the value of PATH from the DataStage job log, and then run Dependency Walker on the library to check dependencies. Save the Dependency Walker image as a .dwi file and send it to IBM Support (see the example below).
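For example, a hypothetical Windows session (the PATH value, the Dependency Walker location, and the library name are placeholders to be taken from the job log and the dir listing):

set PATH=<Path value from job log>
C:\depends\depends.exe C:\IBM\InformationServer\Server\DSComponents\bin\<oraoci_library>.dll

Then use File > Save As in Dependency Walker to save the image as a .dwi file.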

 

DRS Plugin

AIX, Solaris, Linux, or HP-UX

Commands

Description

ls -l <IS_HOME>/Server/DSComponents/bin/drs* > <TEMP>/dscomp_drslist.txt

List of libraries used. Send the generated dscomp_drslist.txt file.

. <IS_HOME>/Server/DSEngine/dsenv

ldd <IS_HOME>/Server/DSComponents/bin/drs*

Library dependency check. Source dsenv, and then run ldd against each DRS library.

Windows

Commands

Description

dir /S <IS_HOME>\Server\DSComponents\bin\drs*dll > <TEMP>\dscomp_drslist.txt

List of libraries used. Send the generated dscomp_drslist.txt file.

set PATH=<Path value from job log>

<PATH_TO_DEPENDS.EXE>\depends.exe <IS_HOME>\Server\DSComponents\bin\drs*dll

Library link check. Set the environment variable PATH equal to the value of PATH from the DataStage job log, and then run Dependency Walker on the library to check dependencies. Save the Dependency Walker image as a .dwi file and send it to IBM Support.

 

Oracle Connector

AIX, Solaris, Linux, or HP-UX

Commands

Description

ls -l <IS_HOME>/Server/DSComponents/bin/ccora* > <TEMP>/dscomp_ccoralist.txt

List of libraries used. Send the generated dscomp_ccoralist.txt file.

. <IS_HOME>/Server/DSEngine/dsenv

ldd <IS_HOME>/Server/DSComponents/bin/ccora*

Library dependency check. Source dsenv, and then run ldd against each Oracle Connector library.

Windows

Commands

Description

dir /S <IS_HOME>\Server\DSComponents\bin\ccora*dll > <TEMP>\dscomp_ccoralist.txt

List of libraries used. Send the generated dscomp_ccoralist.txt file.

set PATH=<Path value from job log>

<PATH_TO_DEPENDS.EXE>\depends.exe <IS_HOME>\Server\DSComponents\bin\ccora*dll

Library link check. Set the environment variable PATH equal to the value of PATH from the DataStage job log, and then run Dependency Walker on the library to check dependencies. Save the Dependency Walker image as a .dwi file and send it to IBM Support.

 

19.3 SalesForce Connector

Return to Index

To provide IBM Support with sufficient information to debug a problem with SalesForce Connector, the following are the minimal collection requirements that should be provided:

  1. The exported job
  2. Director log
  3. Debugging Directory

Use the following commands to gather and generate the files described above.

Windows, AIX, Solaris, Linux, or HP-UX

Commands

Description

The Designer allows a job (such as a failing job) to be exported to a .dsx file. This can be imported by IBM Support to help understand the problem or reproduce it.
Specify the name of the file to be produced when you export the job.

The exported Job

Use the Director's log print dialog to produce a file containing the log of the failing job.
Specify the name of the file to be produced when you print the log.
Specify the "detailed" option in the print dialog.
The Director log
If a job fails consistently, and you are able to edit the job, collect a Debugging directory for the job. To create this directory:
  • Set the environment variables:
     DS_PXDEBUG=1
     APT_DUMP_SCORE=1
  • Re-run the job. A directory will be produced in
    <IS_HOME>/Server/Projects/<project>/Debugging/<job name>

Debugging Directory
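For example, a minimal sketch of setting these variables engine-wide by appending to dsenv (an assumption; they can equally be set per-project in the DS Administrator or per-job in the job properties):

# Append to <IS_HOME>/Server/DSEngine/dsenv
DS_PXDEBUG=1; export DS_PXDEBUG
APT_DUMP_SCORE=1; export APT_DUMP_SCORE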

 

19.4 Connector Migration Tool (CMT)

Return to Index

To provide IBM Support with sufficient information to debug a problem with the Connector Migration Tool (CMT), the following are the minimal collection requirements that should be provided:

  1. Run the Connector Migration Tool (CMT) with the '-L' parameter to enable logging.
    Example:
    "C:\IBM\InformationServer\Clients\CCMigrationTool>CCMigration.exe -L <log file name>" (Where <log file name> is the name of a log file that will be created in the CCMigrationTool directory)
  2. Migrate the job and reproduce problem. Gather the files described below.

Windows, AIX, Solaris, Linux, or HP-UX

Files and Data to collect

Description

<IS_HOME>\Clients\CCMigrationTool\<log_file_name> Log file from the CMT tool
<IS_HOME>\Clients\CCMigrationTool\* (gather only the top-level files; do not gather subfolders or the *.exe, *.xml, and *.jar files). Additional CMT tool artifacts

 

20. Balanced Optimization

Return to Index

Windows users

Files and Data to collect

Description

<USER_HOME>\Application Data\IBM\Information Server\DataStage Client\
<ClientTagID>\BalOp\logs\*.*

The ClientTagID is located in the <IS_HOME>\Version.xml file,
for example:
<InstallType client="true"
clientTagId="652e77e8-ea1e-4850-a8b2-11e6a33dd884"
currentVersion="8.5.0.0" document="false" domain="true"
engine="true" repository="true"/>

 

21. Migration files to collect

Return to Index

Windows, AIX, Solaris, Linux, or HP-UX

Files and Data to collect

Description

<IS_HOME>\migration\migration.todo.txt

Created by the migration task. Describes the manual steps to be performed during migration.

 

22. XML Transformation information and files to collect

Return to Index

Windows only

Files and Data to collect

Description

Client only

 

<USER_HOME>\Application Data\Macromedia\Flash Player\Logs\flashlog.txt Log file for the DataStage XML Transformation UI Flex application. (Note: this file exists only when the debug version of the Adobe Flash Player is used.)

<USER_HOME>\ds_logs\xmlui_*.log

Log files for the DataStage XML Transformation UI .NET application. Note: capture of these log files is already covered by the main DS client collection.
A new log file is created every time the user invokes the DataStage XML Transformation connector application. The file name begins with either "xmlui_assembly" or "xmlui_mdi_importer", followed by a time stamp indicating when the log was created (for example, xmlui_assembly_2010.10.06_11.35.30.093.log).

Tip for troubleshooting:

The timestamp used in the log file name xmlui*.log is also used to build the client correlation ID (clientid): the ID consists of the client's TCP/IP address followed by this timestamp. The UI logs this client ID in its log files and passes it to the server, which enables correlation between a requesting client and the server-side services that process that client's requests. The server-side services use this clientid string when logging messages in the session used by the UI client. The DataStage XML Transformation Flex application log (flashlog.txt) also contains the same clientid.

Collecting XML Stage specific information

DataStage can create logging and tracing information at multiple layers and in different components. See section 7. DataStage (DS) information and files to collect

Client-Server tracing for DataStage Hierarchical feature

Increase the detail included in the client-side and server-side log files for the DataStage XML Transformation feature. This should be done only under the direction of IBM Support.

Command

In the DataStage XML Transformation Flex UI, the main window contains an "Options" link in the upper right corner. Click that link to open a dialog and set serviceability options.

  1. Change the client logging level in the Logging section of this dialog. For example, change the level from "Info" to "Debug" to enable tracing. Log and trace messages are placed in the WAS logs on the server used by the UI; server-side trace must be enabled for trace messages to be written on the server. (If a debug Adobe Flash Player is used, a local log file flashlog.txt is also written.)
  2. Change the server-side logging under Log/Trace Settings Server. For example, check the box labeled "Trace Enabled" to perform trace on the server-side.

Additionally, trace can be enabled for both the Flex and .NET UI applications by changing a logging-level property (see the example below). The default name and value of this property is xmlui.logging.level=INFO. The available levels are ALL, DEBUG, INFO, WARNING, ERROR, FATAL, and OFF. To set trace, use the value DEBUG. This property is contained in the property file xmlui.properties, which is co-located with the connector executable in the directory <IS_HOME>\Clients\Classic.
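For example, to enable trace, xmlui.properties would contain the following line (a minimal sketch; leave any other existing properties unchanged):

xmlui.logging.level=DEBUG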

Further configuration for Flex logging can be performed in the configuration file <USER_HOME>\mm.cfg, which is documented at the Adobe site.
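For example, a minimal mm.cfg that enables the debug player's local trace output (these are standard Flash Player debug settings; see the Adobe documentation for the full list):

ErrorReportingEnable=1
TraceOutputFileEnable=1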

Note:
To set server-side trace for the DataStage XML Transformation UI, use the WebSphere admin console for that server:

  1. On the left hand side of the WAS admin console, expand Troubleshooting and click on Logs and Trace.
  2. Click on the server1 server. Click on Diagnostic Trace.
  3. Click on the Change Log Detail Levels link.
  4. Go to the Runtime tab and find component "com.ibm.e2.applications.wisd.model.AssemblyDesigner".
  5. Set its trace level to 'finest'. Then click Apply.

 

23. Ops Console Workload Manager

Return to Index

Windows, AIX, Solaris, Linux, or HP-UX

Files and Data to collect

Description

<IS_HOME>\Server\DSWLM\logs\*
<IS_HOME>\Server\DSWLM\logs\wlm.properties
All logs and Workload Manager properties
<IS_HOME>\Server\DSWLM\dist\lib\wlm.config.xml Workload Manager configuration file. Check wlm.properties for a possible alternative location of wlm.config.xml
<IS_HOME>\Server\DSWLM\start.wlm.log Workload Manager start log

 

24. Data Flow Designer (DFD)

Return to Index

Windows, AIX, Linux, zLinux

Files and Data to collect - Information Server 11.7 and above

Description

<IS_HOME>\ASBNode\CognitiveDesignerEngine\logs\jetty-*.request.log
<IS_HOME>\ASBNode\CognitiveDesignerEngine\logs\jetty_logging.log
<IS_HOME>\ASBNode\CognitiveDesignerEngine\logs\cognition-engine-connector.log
All logs from the Jetty Engine
WebSphere Application Server files to collect Logs from WebSphere Application Server

 

25. Unified Governance (UG)

Return to Index

Linux

Files and Data to collect - Information Server 11.7 and above

Description

On the Unified Governance (UG) server, run
  /usr/bin/runUGDiagnostics.sh <file.zip>
to create a .zip file that contains a collection of log files and a diagnostic summary HTML report.
All logs from the Unified Governance (UG) server
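For example (the output file name is arbitrary):

/usr/bin/runUGDiagnostics.sh /tmp/ug_diagnostics.zip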

 

 

What to do next: FTP collected data to IBM

Once you have collected the preceding information, submit the diagnostic data to IBM Support. You can submit files using one of the following methods to help speed problem diagnosis: